Results 1 - 20 of 17,049
1.
Cureus ; 16(2): e55231, 2024 Feb.
Article in English | MEDLINE | ID: mdl-38558700

ABSTRACT

Hypothyroidism presents with various symptoms, ranging from commonly observed signs, such as fatigue, cold intolerance, and constipation, to rare features, such as rash and pancytopenia, which can resemble certain rheumatological and hematological diseases. Chronic, excessive iodine consumption can cause primary hypothyroidism. However, when iodine overconsumption becomes a regular part of daily dietary habits, it is difficult for patients to associate their symptoms with their daily iodine intake, and clinicians therefore cannot obtain information on excessive iodine intake from the patient. Here, we present a case of hypothyroidism that was subsequently identified as being caused by excessive daily seaweed consumption for health purposes. This case report highlights the importance of a detailed dietary history in patients diagnosed with primary hypothyroidism without thyroid autoantibodies.

2.
Can J Pain ; 8(1): 2297561, 2024.
Article in English | MEDLINE | ID: mdl-38562673

ABSTRACT

Background: Despite the established efficacy of multidisciplinary chronic pain care, barriers such as long referral wait times and uncoordinated care hinder patients' access to health care. Aims: Here we describe the evolution of a single-entry model (SEM) for coordinating access to chronic pain care across seven hospitals in Toronto and explore its impact on patient care 6 years after implementation. Methods: In 2017, an innovative SEM was implemented for chronic pain referrals in Toronto and surrounding areas. Referrals are received centrally, triaged by a clinical team, and assigned an appointment according to the level of urgency and the most appropriate care setting/provider. To evaluate the impact of the SEM, a retrospective analysis was undertaken to determine referral patterns, patient characteristics, and referral wait times over the past 6 years. Results: Implementation of the SEM streamlined the number of steps in the referral process and led to a standardized referral form with common inclusion and exclusion criteria across sites. Over the 6-year period, referrals increased by 93% and the number of unique providers increased by 91%. Chronic pain service wait times were reduced from 299 (±158) days to 176 (±103) days. However, wait times for certain pain diagnoses, such as chronic pelvic pain and fibromyalgia, still far exceed the average. Conclusions: The results indicate that the SEM helped reduce wait times for pain conditions and standardized the referral pathway. Continued data capture can help identify gaps in care and enable further refinement and improvement of health care.



3.
Nutr Neurosci ; : 1-9, 2024 Apr 02.
Article in English | MEDLINE | ID: mdl-38564407

ABSTRACT

BACKGROUND: Epilepsy is a neurological disorder characterized by recurrent seizures. We aimed to investigate the association between the percentage of dietary carbohydrate intake (DCI) and epilepsy prevalence among American adults. METHODS: We analyzed data from 9,584 adults aged 20-80 years who participated in the National Health and Nutrition Examination Survey from 2013 to 2018. Logistic regression was applied to explore the association between the percentage of DCI and epilepsy prevalence. RESULTS: A total of 146 (1.5%) individuals with epilepsy were identified in this study. The average age of the participants was 56.4 years, and 5,454 (56.9%) individuals were female. A high DCI was associated with an increased prevalence of epilepsy (odds ratio [OR], 4.56; 95% confidence interval [CI], 1.11-18.69; P = 0.035) after adjusting for age, sex, marital status, race/ethnicity, educational level, family income, body mass index, smoking status, drinking status, hypertension, diabetes, and cardiovascular disease. Stratified analyses indicated a positive correlation between DCI and epilepsy prevalence across adult subgroups with different characteristics. Compared with individuals in quartile 1 of DCI (<40.5%), those in quartile 4 (>55.4%) had an adjusted OR for epilepsy of 1.72 (95% CI, 1.09-2.73, P = 0.02, P for trend = 0.012). CONCLUSIONS: A high percentage of DCI was associated with an increased prevalence of epilepsy. The risk of epilepsy increased 3.5-fold with a 1% increase in DCI. These results suggest an important role of DCI in the dietary management of epilepsy.
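To make the modeling step concrete, here is a minimal sketch of a covariate-adjusted logistic regression on quartiles of dietary carbohydrate intake, in the spirit of the analysis described above; the data frame, variable names, and simulated values are entirely hypothetical and are not the NHANES variables used by the authors.

```python
# Hedged sketch of a quartile-based logistic regression for a rare outcome.
# Column names (epilepsy, dci_pct, age, sex, bmi) are hypothetical placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "dci_pct": rng.normal(48, 8, n),      # % of energy from carbohydrate
    "age": rng.integers(20, 81, n),
    "sex": rng.integers(0, 2, n),
    "bmi": rng.normal(28, 5, n),
})
# Simulate an uncommon outcome whose log-odds rise with carbohydrate share
logit = -4 + 0.04 * (df["dci_pct"] - 48)
df["epilepsy"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# Quartiles of DCI, with Q1 as the reference category
df["dci_q"] = pd.qcut(df["dci_pct"], 4, labels=["Q1", "Q2", "Q3", "Q4"])

model = smf.logit("epilepsy ~ C(dci_q) + age + C(sex) + bmi", data=df).fit(disp=False)
odds_ratios = np.exp(model.params)        # adjusted ORs for Q2-Q4 vs. Q1
print(odds_ratios.round(2))
```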

4.
Article in English | MEDLINE | ID: mdl-38563778

ABSTRACT

Background and Objective: Hypertension and type 2 diabetes are strong risk factors for cardiovascular disease, and their management requires lifestyle changes, including a shift in dietary habits. Salt consumption has increased over recent decades in some countries, but its association with type 2 diabetes remains unknown. Thus, we aimed to estimate salt intake among adults with and without diabetes and to assess whether concomitant hypertension and diabetes are associated with higher salt intake. Methods: Data from 11,982 adults aged 35-74 years enrolled at baseline in the Longitudinal Study of Adult Health (ELSA-Brasil; 2008-2010) were analyzed. A clinical and anthropometric evaluation was performed, and daily salt intake was estimated from overnight 12-hour urinary sodium excretion. Results: Salt intake (grams per day) was higher in participants with diabetes than in those without diabetes, regardless of sex (men: 14.2 ± 6.4 vs. 12.4 ± 5.6, P < 0.05; women: 10.5 ± 4.8 vs. 9.1 ± 4.1, P < 0.05). Salt intake was also higher in participants with fasting glucose ≥126 mg/dL or HbA1c ≥6.5%, but not in participants with a 2-hour post-load glucose ≥200 mg/dL on the glucose tolerance test. When hypertension and diabetes coexisted, salt consumption was higher than among people without these conditions. The prevalence of hypertension increased with increasing salt intake in women with diabetes, but not in men with this condition. Conclusions: Our findings highlight the high salt consumption among individuals with diabetes and/or hypertension, and the need for effective strategies to reduce salt consumption in these groups at increased risk for major cardiovascular events, especially in women.

5.
Heliyon ; 10(7): e28595, 2024 Apr 15.
Article in English | MEDLINE | ID: mdl-38571581

ABSTRACT

Background: Dietary nutrient intake contributes to urination; however, the association between dietary nutrient intake, especially fat, and urinary incontinence (UI) is not well understood. The most common types of UI are stress UI (SUI) and urgency UI (UUI). Objective: To investigate the potential effect(s) of dietary fat intake on UI and explore its mechanism of action in relation to body mass index (BMI). Methods: A cross-sectional survey of data from 15,121 individuals (20-85 years of age) from the National Health and Nutrition Examination Survey (2001-2008), a random population-based sample, was performed. Dietary intake data were collected through 24-h dietary recall interviews. UI and covariate data were collected through in-person interviews, and UI was assessed according to the American Urological Association Symptom Index. Odds ratios (ORs) for SUI and UUI were calculated using multivariate logistic regression analysis, and mediation effects were estimated using observational mediation analysis. Results: Higher total fat intake was associated with higher odds of UI (OR 1.44 [95% confidence interval (CI) 1.08-1.93]). Females who consumed more saturated fatty acids (SFA), monounsaturated fatty acids (MUFA), and polyunsaturated fatty acids (PUFA) were more likely to develop SUI. BMI partially explained the associations of total fat, SFA, MUFA, and PUFA with SUI; the proportions of the effect mediated by BMI were 14.7%, 13.0%, 18.7%, and 16.3%, respectively. Conclusions: The results emphasize the key role of dietary fat intake in the prevalence of UI: higher fat intake was positively associated with UI, and BMI partially mediated the effect of fat intake on SUI.
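The "proportion of the effect mediated" reported above comes from an observational mediation analysis. Below is a minimal sketch of one simple way to obtain such a figure, the difference-in-coefficients approach, on simulated data with hypothetical variable names; the authors' actual counterfactual-based procedure may differ.

```python
# Hedged sketch: how much of the fat-intake/SUI association is explained by BMI,
# using the difference-in-coefficients method on simulated data (not study data).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 5000
fat = rng.normal(70, 20, n)                      # total fat intake, g/day (hypothetical)
bmi = 22 + 0.05 * fat + rng.normal(0, 3, n)      # mediator depends on exposure
logit = -3 + 0.010 * fat + 0.08 * (bmi - 27)     # outcome depends on both
sui = rng.binomial(1, 1 / (1 + np.exp(-logit)))

df = pd.DataFrame({"fat": fat, "bmi": bmi, "sui": sui})

total = sm.Logit(df["sui"], sm.add_constant(df[["fat"]])).fit(disp=False)
direct = sm.Logit(df["sui"], sm.add_constant(df[["fat", "bmi"]])).fit(disp=False)

# Crude proportion mediated on the log-odds scale; formal analyses use
# counterfactual definitions and bootstrap confidence intervals.
prop_mediated = 1 - direct.params["fat"] / total.params["fat"]
print(f"share of the fat-SUI association explained by BMI: {prop_mediated:.1%}")
```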

6.
Front Nutr ; 11: 1337738, 2024.
Article in English | MEDLINE | ID: mdl-38571751

ABSTRACT

Introduction: Taurine has diverse and valuable biological functions, including antioxidant activity and regulation of osmotic pressure. Maintaining physical fitness from middle age onward is important for healthy life expectancy. Although taurine administration improves muscle endurance and strength, its role in maintaining physical fitness remains unclear. We aimed to clarify the longitudinal association between taurine intake and changes in physical fitness. Methods: Participants comprised men and women aged ≥40 years who participated in the third (2002-2004; baseline) and seventh (2010-2012; follow-up) waves of the National Institute for Longevity Sciences-Longitudinal Study of Aging (NILS-LSA) and completed a 3-day weighed dietary record survey at baseline. A table of taurine content was prepared for 751 foods (covering five food groups: seaweed; fish and shellfish; meat; eggs; and milk and dairy products) from the 2010 Standard Tables of Food Composition in Japan (1,878 foods). Four physical fitness items (knee extension muscle strength, sit-and-reach, one-leg standing with eyes closed, and maximum walking speed) were measured at baseline and follow-up. We analyzed the association of baseline taurine intake with changes in physical fitness at follow-up using a general linear model (GLM) and trend tests, adjusting for baseline sex, age, height, weight, educational level, self-rated health, smoking status, depressive symptoms, and clinical history. Results: The estimated average daily taurine intake (standard deviation) was 207.5 (145.6) mg/day at baseline. Among the four physical fitness parameters, higher taurine intake was associated with a greater change in knee extension muscle strength (T1: 0.1, T2: 0.8, T3: 1.1 kgf; GLM p < 0.05; p for trend < 0.05) and with a smaller decline in knee extension muscle strength in the subgroup analysis of participants aged ≥65 years (T1: -1.9, T2: -1.7, T3: -0.4 kgf; GLM p < 0.05; p for trend < 0.05). No relationship was found between taurine intake and the remaining three fitness measures. Conclusion: Dietary taurine intake potentially contributes to the maintenance of knee extension muscle strength over 8 years among Japanese community-dwelling middle-aged and older individuals. This is the first study to investigate the association of dietary taurine intake with muscle strength.

7.
Front Nutr ; 11: 1354459, 2024.
Article in English | MEDLINE | ID: mdl-38571757

ABSTRACT

Background: Lactating mothers are frequently at risk of nutritional deficiencies due to the physiological requirements of lactation. Throughout the world, a significant number of lactating mothers have inadequate micronutrient intake, and evidence on micronutrient intake during lactation is limited in rural Ethiopia. Therefore, this study aimed to determine micronutrient intake inadequacy and associated factors among lactating mothers. Methods and materials: A community-based cross-sectional study was conducted from February 1 to 18, 2023, among lactating mothers in rural areas of the North Mecha District of Amhara Region, Northwest Ethiopia. A multistage sampling technique was used to select 449 study participants. An interviewer-administered questionnaire was used to collect dietary intake data through a single multiphasic interactive 24-h dietary recall. The NutriSurvey 2007 software and the Ethiopian, Tanzanian, and Kenyan food composition tables were used to calculate nutrient values for the 12 selected micronutrients; SPSS version 25 was used for the remainder of the analysis. To evaluate nutrient intakes, the Nutrient Adequacy Ratio (NAR) was calculated for each micronutrient as the ratio of intake to the recommended intake, and the Mean Adequacy Ratio (MAR) was calculated by dividing the sum of all NAR values by the number of micronutrients. A logistic regression analysis was conducted to determine the factors contributing to overall micronutrient intake inadequacy, with statistical significance set at a p-value < 0.05. Result: A total of 430 lactating mothers participated in the study (96% response rate), with a mean age of 29.46 ± 5.55 years. The overall prevalence of micronutrient intake inadequacy was 72.3% (95% CI: 67.9, 76.5). The odds of micronutrient intake inadequacy were 2.5 times higher among lactating mothers aged 18-25 years than among mothers aged ≥36 years (AOR = 2.52, 95% CI: 1.09, 5.83). Mothers who were unable to read and write and those who had not completed primary school were 3.5 (AOR = 3.49, 95% CI: 1.24, 9.83) and 3.6 (AOR = 3.56, 95% CI: 1.06, 11.99) times more likely, respectively, to have micronutrient intake inadequacy than mothers who had completed secondary school or above. Mothers whose partner's occupation was other than farming were 3.3 times more likely to have micronutrient intake inadequacy than mothers whose partners were engaged in farming (AOR = 3.32, 95% CI: 1.08, 10.27). Lactating mothers from food-insecure households were 83% more likely to have micronutrient intake inadequacy than those from food-secure households (AOR = 1.83, 95% CI: 1.04, 3.23), and mothers with unfavorable nutrition-related attitudes were 77% more likely to have inadequate micronutrient intake than mothers with favorable attitudes (AOR = 1.77, 95% CI: 1.07, 2.93). Conclusion: The prevalence of micronutrient intake inadequacy among lactating mothers was high. Maternal age, maternal educational status, partner's occupation, household food security, and nutrition-related attitude were significantly associated with micronutrient intake inadequacy. Community-driven nutrition education and interventions are needed to address the high micronutrient intake inadequacy among lactating mothers in rural Ethiopia.
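To illustrate the NAR/MAR computation described in the methods above, here is a minimal sketch with hypothetical intakes and recommended values; capping each NAR at 1 before averaging is the usual convention, although the abstract does not state whether it was applied.

```python
# Hedged sketch of Nutrient Adequacy Ratio (NAR) and Mean Adequacy Ratio (MAR).
# Intakes and recommended values below are illustrative, not from the study.
observed_intake = {"iron_mg": 9.0, "zinc_mg": 6.5, "vitamin_a_ug": 320.0,
                   "folate_ug": 410.0, "calcium_mg": 600.0}
recommended = {"iron_mg": 15.0, "zinc_mg": 9.5, "vitamin_a_ug": 850.0,
               "folate_ug": 500.0, "calcium_mg": 1000.0}

# NAR = intake / recommended intake, capped at 1 so excess of one nutrient
# cannot compensate for a shortfall in another.
nar = {k: min(observed_intake[k] / recommended[k], 1.0) for k in recommended}

# MAR = sum of NAR values divided by the number of micronutrients considered.
mar = sum(nar.values()) / len(nar)
print(nar)
print(f"MAR = {mar:.2f}")  # compared against a study-specific cut-off to flag inadequacy
```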

8.
EClinicalMedicine ; 71: 102572, 2024 May.
Article in English | MEDLINE | ID: mdl-38572081

ABSTRACT

Background: Ultra-processed foods (UPFs) are emerging as a risk factor for colorectal cancer (CRC), yet how post-diagnostic UPF intake may impact CRC prognosis remains unexplored. Methods: Data collected from food frequency questionnaires were used to estimate intakes of total UPFs and UPF subgroups (serving/d) at least 6 months but less than 4 years post-diagnosis among 2498 patients diagnosed with stages I-III CRC within the Nurses' Health Study and Health Professionals Follow-up Study during 1980-2016. Hazard ratios (HR) and 95% confidence intervals (CIs) of all-cause, CRC- and cardiovascular disease (CVD)-specific mortality in association with UPF consumption were estimated using an inverse probability weighted multivariable Cox proportional hazards regression model, adjusted for confounders. Findings: The mean (SD) age of patients at diagnosis was 68.5 (9.4) years. A total of 1661 deaths were documented, including 321 from CRC and 335 from CVD. Compared to those in the lowest quintile (median = 3.6 servings/d), patients in the highest quintile (median = 10 servings/d) of post-diagnostic UPF intake had higher CVD mortality (HR = 1.65, 95% CI = 1.13-2.40) but not CRC or all-cause mortality. Among UPF subgroups, higher consumption of fats/condiments/sauces was associated with a higher risk of CVD-specific mortality (highest vs. lowest quintile of intake, HR = 1.96, 95% CI = 1.41-2.73), and higher intake of ice cream/sherbet was associated with an increased risk of CRC-specific mortality (highest vs. lowest quintile, HR = 1.86, 95% CI: 1.33-2.61). No statistically significant association was found between UPF subgroups and overall mortality. Interpretation: Higher post-diagnostic intake of total UPFs and fats/condiments/sauces in CRC survivors is associated with higher CVD mortality, and higher ice cream/sherbet intake is linked to higher CRC mortality. Funding: US National Institutes of Health and the American Cancer Society.
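As a rough illustration of the weighted survival model named above, the sketch below fits a Cox proportional hazards model with stabilized inverse probability weights on simulated data; the exposure definition, weighting model, and variable names are illustrative assumptions, not the study's actual specification.

```python
# Hedged sketch: Cox proportional hazards model with inverse probability weights,
# on simulated data. Not the study's actual cohort, covariates, or weighting model.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from lifelines import CoxPHFitter

rng = np.random.default_rng(7)
n = 3000
age = rng.normal(68, 9, n)
upf = rng.normal(6, 2, n) + 0.05 * (68 - age)     # UPF servings/day (illustrative)
high_upf = (upf > np.median(upf)).astype(int)

# Toy survival times: hazard rises modestly with high UPF intake and age
baseline = rng.exponential(10, n)
time = baseline / np.exp(0.25 * high_upf + 0.02 * (age - 68))
event = rng.binomial(1, 0.6, n)                    # crude random censoring

# Stabilized inverse probability weights for the binary exposure
ps_model = sm.Logit(high_upf, sm.add_constant(age)).fit(disp=False)
ps = ps_model.predict(sm.add_constant(age))
p_exposed = high_upf.mean()
weights = np.where(high_upf == 1, p_exposed / ps, (1 - p_exposed) / (1 - ps))

df = pd.DataFrame({"time": time, "event": event, "high_upf": high_upf,
                   "age": age, "w": weights})
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event", weights_col="w", robust=True)
cph.print_summary()   # hazard ratio for high_upf is exp(coef)
```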

9.
J Nutr Sci ; 13: e12, 2024.
Article in English | MEDLINE | ID: mdl-38572364

ABSTRACT

This study aimed to compare differences in the intake of food groups and nutrients between Japanese adults who consumed the recommended daily vegetable intake (350 g/day) and those who did not. Dietary information was obtained from one-day dietary records collected in the 2016 National Health and Nutrition Survey, which was conducted in 46 prefectures in Japan. Participants aged ≥20 years (n = 21,606; 53.8% women) were classified into <350 g/day and ≥350 g/day groups. Inter-group differences for 17 food groups and 27 nutrients were assessed as percentages of consumers (food groups only) and as energy-adjusted intakes (units/MJ/d or % of total energy intake). Overall, 29% of participants consumed ≥350 g/day of vegetables. The ≥350 g/day group had a higher percentage of consumers and higher energy-adjusted intakes for all vegetable subgroups than the <350 g/day group. For the other food groups, the ≥350 g/day group had higher percentages of consumers for all food groups except cereals, eggs, and condiments and seasonings, which showed no significant differences. The ≥350 g/day group also had significantly higher energy-adjusted intakes of potatoes and other tubers, mushrooms, meats, and condiments and seasonings, but significantly lower intakes of cereals, eggs, savoury snacks and confectioneries, and beverages. The ≥350 g/day group had a significantly higher intake of almost all (25/27) nutrients, including sodium, than the <350 g/day group. Participants with a vegetable intake ≥350 g/day may have a more favourable intake of food groups and nutrients; however, attention to salt intake is necessary when promoting vegetable consumption.


Subject(s)
Energy Intake, Vegetables, Adult, Humans, Female, Male, Japan, Eating, Nutrition Surveys
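The energy-adjusted intakes reported in the study above are nutrient densities; a minimal sketch of the calculation (per MJ per day, or as a percentage of energy for energy-yielding nutrients) with hypothetical values follows.

```python
# Hedged sketch of energy adjustment as nutrient density (units/MJ/day) and,
# for energy-yielding nutrients, as a percentage of total energy intake.
# All numbers are illustrative, not values from the survey.
KCAL_PER_MJ = 239.0          # 1 MJ is approximately 239 kcal
KCAL_PER_G_FAT = 9.0

energy_kcal = 2100.0          # reported daily energy intake
sodium_mg = 3800.0            # reported daily sodium intake
fat_g = 70.0                  # reported daily fat intake

energy_mj = energy_kcal / KCAL_PER_MJ
sodium_density = sodium_mg / energy_mj                   # mg/MJ/day
fat_pct_energy = fat_g * KCAL_PER_G_FAT / energy_kcal * 100

print(f"sodium: {sodium_density:.0f} mg/MJ/day")
print(f"fat: {fat_pct_energy:.1f}% of energy")
```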
10.
Front Endocrinol (Lausanne) ; 15: 1328748, 2024.
Article in English | MEDLINE | ID: mdl-38572474

ABSTRACT

Background: In observational studies, the relationship between coffee intake and bone mineral density (BMD) is contradictory, and residual confounding tends to bias the results of these studies. Therefore, we used a two-sample Mendelian randomization (MR) approach to further investigate the potential causal relationship between the two. Methods: Genetic instrumental variables (IVs) associated with coffee intake were derived from a genome-wide association study (GWAS) of Food Frequency Questionnaire (FFQ) data in 428,860 British individuals and matched against phenotypes in PhenoScanner. Summarized data on BMD were obtained from 537,750 participants, including total body BMD (TB-BMD), TB-BMD in five age brackets (≥60, 45-60, 30-45, 15-30, and 0-15 years), and BMD at four body sites: the lumbar spine, the femoral neck, the heel, and the ultradistal forearm. We used the inverse variance weighted (IVW) method as the primary analytical method for causal inference. In addition, several sensitivity analyses (MR-Egger, weighted median, MR-PRESSO, Cochran's Q test, and leave-one-out test) were used to test the robustness of the results. Results: After Bonferroni correction, coffee intake showed a potential positive correlation with total body BMD (effect estimate [beta]: 0.198, 95% confidence interval [CI]: 0.05-0.35, P=0.008). In subgroup analyses, coffee intake was potentially positively associated with TB-BMD in the 45-60 and 30-45 year age brackets (beta: 0.408, 95% CI: 0.12-0.69, P=0.005; beta: 0.486, 95% CI: 0.12-0.85, P=0.010). In addition, a significant positive correlation with heel BMD was observed (beta: 0.173, 95% CI: 0.08-0.27, P=0.002). The results of the sensitivity analyses were generally consistent. Conclusion: The results of the present study provide genetic evidence that coffee intake is beneficial for bone density. Further studies are needed to reveal the biological mechanisms and offer solid support for clinical guidelines on osteoporosis prevention.


Subject(s)
Bone Density, Coffee, Humans, Bone Density/genetics, Genome-Wide Association Study, Mendelian Randomization Analysis, Femur Neck
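For reference, the inverse variance weighted (IVW) estimate used as the primary method in the study above is a weighted average of per-SNP Wald ratios. The sketch below computes a fixed-effect IVW estimate from hypothetical summary statistics; the numbers are illustrative, not the GWAS data used by the authors.

```python
# Hedged sketch of a fixed-effect inverse variance weighted (IVW) Mendelian
# randomization estimate from per-SNP summary statistics (made-up numbers).
import numpy as np

beta_exposure = np.array([0.042, 0.035, 0.051, 0.028])   # SNP -> coffee intake
beta_outcome  = np.array([0.009, 0.006, 0.012, 0.004])   # SNP -> BMD
se_outcome    = np.array([0.004, 0.003, 0.005, 0.003])

# Per-SNP Wald ratios and their first-order standard errors
wald = beta_outcome / beta_exposure
se_wald = se_outcome / np.abs(beta_exposure)

# IVW: weight each ratio by the inverse of its variance
weights = 1.0 / se_wald**2
beta_ivw = np.sum(weights * wald) / np.sum(weights)
se_ivw = np.sqrt(1.0 / np.sum(weights))

print(f"IVW estimate = {beta_ivw:.3f} (SE {se_ivw:.3f})")
```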
11.
Nutr Neurosci ; : 1-19, 2024 Apr 05.
Article in English | MEDLINE | ID: mdl-38576309

ABSTRACT

BACKGROUND: The bed nucleus of the stria terminalis (BNST) is a structure with a peculiar neurochemical composition that is involved in modulating anxiety-like behavior and fear. AIM: The present study investigated effects on BNST neurochemical composition and neuronal structure at critical points of the postnatal period in the male offspring of gestationally protein-restricted rats. METHODS: Dams were maintained during pregnancy on isocaloric rodent laboratory chow with standard protein content [NP, 17%] or low protein content [LP, 6%]. The BNST of male NP and age-matched LP offspring was studied using the isotropic fractionator method, neuronal 3D reconstruction, dendritic-tree analysis, blotting analysis, and high-performance liquid chromatography (HPLC). RESULTS: Serum corticosterone levels were higher in 14-day-old male LP offspring than in NP rats, with no difference in 7-day-old progeny. The BNST total cell number and the volume of the anterodorsal BNST division were significantly reduced in LP progeny on the 14th postnatal day compared with NP offspring. HPLC analysis of the BNST from 7-day-old LP offspring revealed increased norepinephrine levels compared with NP progeny. Blot analysis of the BNST from 7-day-old LP offspring revealed reduced levels of GR and BDNF associated with enhanced CRF1 expression compared with NP offspring. Fourteen-day-old LP offspring showed reduced expression of MR and 5HT1A associated with decreased DOPAC and DOPA turnover levels relative to NP rats. In conclusion, the BNST cellular and neurochemical changes may represent developmental adaptations in response to elevated fetal exposure to maternal corticosteroid levels. In this way, gestational malnutrition alters BNST content and structure and contributes to already-known behavioral changes.

12.
Curr Nutr Rep ; 2024 Apr 05.
Article in English | MEDLINE | ID: mdl-38578590

ABSTRACT

Estimates of dietary polyphenol intake have been limited to specific samples from certain population groups, and different databases have been used to quantify the levels of these compounds, which makes it difficult to compare results. PURPOSE OF REVIEW: This review collated estimates of polyphenol intake from population studies including adults and elderly people from different parts of the world by using a single database, Phenol-Explorer. RECENT FINDINGS: Across seven population-based studies performed in five different countries, Brazil had the lowest intake of polyphenols, whereas Poland had the highest. The most ingested subclasses of polyphenols in the different countries were phenolic acids and flavonoids, and non-alcoholic beverages (coffee, tea, and orange juice) were the foods that contributed most to polyphenol intake. Despite the attempt to standardize this review to obtain comparable worldwide intake estimates, gaps were found regarding the assessment of food consumption, standardization of the polyphenol content of foods in Phenol-Explorer, calculation in aglycone equivalents, and caloric adjustment of the estimates. More studies on the dietary polyphenol intake of representative samples of populations from different countries are needed to collate more data on the quantities consumed and the main contributing foods.

13.
Ecotoxicol Environ Saf ; 276: 116305, 2024 Apr 09.
Article in English | MEDLINE | ID: mdl-38599158

ABSTRACT

Heavy metal(loid)s (HMs) in soils can accumulate in the crops grown on them and, once those crops are ingested, can harm human health. Hence, the health risks posed by HMs in three crops were assessed for different population groups using a health risk assessment (HRA) model coupled with Monte Carlo simulation. Results revealed that Zn had the highest concentration among the three crops, while Ni was the main polluting element in maize and soybean, and As in rice. The non-carcinogenic risk for all populations through rice ingestion was at an "unacceptable" level, and teenagers faced higher risk than adults and children. All populations might incur carcinogenic risk through ingestion of the three crops, with a similar order of total carcinogenic risk (TCR): TCR(adults) > TCR(teenagers) > TCR(children). As and Ni were identified as priority control HMs in the study area because of their high contribution to health risks. According to the HRA results, human health risk was associated with crop variety, HM species, and age group. Our findings suggest that limiting the maximum allowable intake rate alone is not sufficient to prevent health risks caused by HMs in crops, and thus additional risk precautions are needed.
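The health risk assessment framework referenced above typically combines an average daily dose with a reference dose (non-carcinogenic hazard quotient) or a slope factor (carcinogenic risk). The sketch below runs a toy Monte Carlo version of that calculation; the exposure distributions, reference dose, and slope factor are placeholders, not the study's measured values.

```python
# Hedged Monte Carlo sketch of a crop-ingestion health risk assessment:
# HQ = ADD / RfD and carcinogenic risk = ADD * SF, with ADD = (C*IR*EF*ED)/(BW*AT).
# All distributions and constants below are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

C  = rng.lognormal(mean=np.log(0.15), sigma=0.4, size=n)   # metal in rice, mg/kg
IR = rng.normal(0.30, 0.05, n)                              # rice intake, kg/day
BW = rng.normal(60.0, 10.0, n)                              # body weight, kg
EF, ED = 365.0, 30.0                                        # days/year, years of exposure
AT_nc = EF * ED                                             # averaging time, non-carcinogenic
AT_c  = 365.0 * 70.0                                        # averaging time, carcinogenic

RfD = 3e-4     # oral reference dose, mg/kg/day (placeholder)
SF  = 1.5      # oral slope factor, (mg/kg/day)^-1 (placeholder)

add_nc = C * IR * EF * ED / (BW * AT_nc)
add_c  = C * IR * EF * ED / (BW * AT_c)

hq = add_nc / RfD            # hazard quotient
cr = add_c * SF              # incremental lifetime cancer risk

print(f"HQ median={np.median(hq):.2f}, 95th pct={np.percentile(hq, 95):.2f}")
print(f"CR median={np.median(cr):.2e}, 95th pct={np.percentile(cr, 95):.2e}")
# HQ > 1 and CR > 1e-4 are commonly used screening thresholds.
```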

14.
Article in English | MEDLINE | ID: mdl-38598120

ABSTRACT

Aflatoxin (AF) contamination of staple foods, such as rice, is caused by fungal contamination with Aspergillus species. AFs are genotoxic, carcinogenic, and immunosuppressive. Hence, the present study was conducted to elucidate the prevalence of AF contamination in rice samples collected from local markets of Hyderabad, Telangana, India. The collected rice samples were analysed for AF by HPLC with fluorescence detection (HPLC-FLD). Based on the AF contamination levels and the dietary intake of rice, the health risk was assessed using the margin of exposure (MOE) and liver cancer risk in adults, adolescents, and children. AFB1 and AFB2 were detected in 54% and 34% of the rice samples, at concentrations ranging from 0 to 20.35 µg/kg and 0 to 1.54 µg/kg, respectively. Three rice samples exceeded the Food Safety and Standards Authority of India (FSSAI) total AF acceptable limit of 15 µg/kg. The average MOE values were 53.73, 50.58, and 35.69 (all <10,000) for adults, adolescents, and children, respectively. The average liver cancer risk associated with rice consumption in the population of Hyderabad was 0.27, 0.28, and 0.40 hepatocellular carcinoma (HCC) cases/year/100,000 individuals in adults, adolescents, and children, respectively. This study revealed an adverse health risk to the population of Hyderabad due to consumption of AF-contaminated rice.
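The MOE and liver cancer risk figures above follow the standard aflatoxin risk pipeline: estimated daily intake, margin of exposure against a benchmark dose, and a cancer potency adjusted for hepatitis B prevalence. A minimal sketch with placeholder inputs is shown below; the BMDL and potency values are commonly cited JECFA/EFSA figures used here only for illustration, since the study's exact parameters are not given in the abstract.

```python
# Hedged sketch of aflatoxin B1 exposure, margin of exposure (MOE), and liver
# cancer risk estimation. All inputs are illustrative, not the study's data.
mean_afb1_ug_per_kg = 3.2      # mean AFB1 in rice (illustrative)
rice_intake_g_per_day = 220.0  # illustrative adult rice consumption
body_weight_kg = 60.0
hbsag_prevalence = 0.03        # fraction of population HBsAg-positive (illustrative)

# Estimated daily intake in ng/kg body weight/day (1 ug/kg food = 1 ng/g food)
edi = mean_afb1_ug_per_kg * rice_intake_g_per_day / body_weight_kg

# Margin of exposure against a benchmark dose lower bound (BMDL10)
BMDL10 = 170.0                 # ng/kg bw/day, a commonly cited value for AFB1
moe = BMDL10 / edi             # MOE < 10,000 indicates a public health concern

# Liver cancer risk: cases/year per 100,000 per ng/kg bw/day of intake,
# weighted by hepatitis B surface antigen status (commonly cited potencies)
potency = 0.01 * (1 - hbsag_prevalence) + 0.3 * hbsag_prevalence
hcc_cases_per_100k_per_year = edi * potency

print(f"EDI = {edi:.1f} ng/kg bw/day, MOE = {moe:.0f}")
print(f"estimated HCC risk = {hcc_cases_per_100k_per_year:.2f} cases/year/100,000")
```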

15.
Article in English | MEDLINE | ID: mdl-38588574

ABSTRACT

Purpose: Dietary phytochemicals have been examined as adjuvants for the prevention and treatment of obesity and diabetes. This study aimed to examine the potential associations of the dietary "Phytochemical Index" (PI) and polyphenol intake with obesity- and diabetes-related parameters. Materials and Methods: The case-control study involved 331 participants (156 overweight/obese and 175 normal weight), aged 18-50 years. Dietary intake was assessed using the 24-hr dietary recall method, and the PI score was calculated as the percentage of energy intake provided by phytochemical-rich foods. Polyphenol intakes were calculated using the Phenol-Explorer and U.S. Department of Agriculture databases. Anthropometric measurements were taken; serum glucose, insulin, and lipid profiles were analyzed; the homeostatic model assessment for insulin resistance (HOMA-IR) was calculated; and blood pressure was measured. Linear regression analyses were used to examine the potential associations. Results: Participants with higher PI scores had higher total and some sub-class polyphenol intakes than those with lower scores (P < 0.05 for each). The dietary PI score was not associated with any of the anthropometric measurements; however, total polyphenol and flavonoid intakes were inversely associated with body mass index (ß = -0.269, P = 0.049; ß = -0.262, P = 0.048, respectively), waist circumference (ß = -0.127, P = 0.021; ß = -0.130, P = 0.016, respectively), and waist-to-hip ratio (ß = -20.724, P = 0.032; ß = -22.199, P = 0.018, respectively) after adjusting for potential confounders. Neither the dietary PI score nor total and sub-class polyphenol intakes were associated with a better metabolic profile, except for lignan intake, which was inversely associated with HOMA-IR (ß = -0.048, P = 0.011). Conclusions: Higher dietary polyphenol intake may have potential in the prevention of obesity and diabetes, and validated practical tools are essential for the assessment of polyphenol intake in clinical practice.
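The dietary Phytochemical Index above is defined as the share of daily energy coming from phytochemical-rich foods; a minimal sketch of that calculation with hypothetical 24-hour recall totals follows. Which food groups count as phytochemical-rich is an assumption here, not a list taken from the study.

```python
# Hedged sketch of the dietary Phytochemical Index (PI):
# PI = 100 * (energy from phytochemical-rich foods) / (total energy intake).
# Food-group energy values below are illustrative 24-h recall totals.
phytochemical_rich_kcal = {
    "fruits": 180.0, "vegetables": 120.0, "whole_grains": 260.0,
    "legumes": 90.0, "nuts_seeds": 110.0, "olive_oil": 80.0,
}
total_energy_kcal = 2050.0

pi = 100.0 * sum(phytochemical_rich_kcal.values()) / total_energy_kcal
print(f"Phytochemical Index = {pi:.1f}")
```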

16.
Hypertens Res ; 2024 Apr 08.
Article in English | MEDLINE | ID: mdl-38589606

ABSTRACT

Non-communicable diseases (NCDs) pose a significant global health challenge, with unhealthy diets identified as a major risk factor. Sodium and potassium, essential minerals for human health, play important roles in various bodily functions, and an imbalance in their intake can have significant health implications, particularly for hypertension and cardiovascular disease. This review compiles dietary sodium and potassium intake recommendations from prominent global health organizations and compares global guidelines with Japan's Dietary Reference Intake (DRI) guidelines. Sodium and potassium intake guidelines from organizations such as the World Health Organization (WHO), the American College of Cardiology (ACC) and American Heart Association (AHA), the Dietary Guidelines for Americans (DGA), the European Food Safety Authority (EFSA), and the DRI for Japanese exhibit variations. Japan's historically higher sodium goal aligns with Southeast Asia, where traditional preserved foods contribute to high sodium intake, whereas Japan's lower potassium goal contrasts with other Asian countries that promote a diet rich in fruits and vegetables. Japan's ongoing effort to align with global recommendations reflects a graduated approach that takes social habits into account. While harmonizing international efforts is essential, appreciating regional diversity by tailoring guidelines to cultural and dietary practices is paramount. Implementing context-specific guidelines informed by scientific research can contribute to global efforts to promote healthy diets and reduce the burden of NCDs. In short, global guidelines recommending daily dietary intake goals for sodium and potassium vary; these disparities are influenced by diverse factors, including cultural dietary habits, socioeconomic status, health priorities, and available scientific research, and each population should follow the recommendations for its region.

18.
Front Nutr ; 11: 1287237, 2024.
Article in English | MEDLINE | ID: mdl-38585614

ABSTRACT

Background: The literature highlights the unhealthy nutritional habits prevalent among Asian adolescents and their high level of body image dissatisfaction. This study systematically reviews the literature on the effect of nutritional education interventions on their nutritional knowledge, food intake behavior, attitude, practice, and body image. Methods: We searched for relevant published studies in PubMed, Web of Science, Scopus, Science Direct, and Springer using the PICO framework and performed a quality assessment using a 10-point checklist adapted from the National Institutes of Health tool. Results: The majority of the nutritional education interventions improved unhealthy food intake and body image misperception, particularly nutritional knowledge/self-efficacy, healthy dietary habits, physical activity, and fruit and vegetable intake. We also found a negative association with excess weight gain, obesity, and unethical weight reduction practices leading to body image dissatisfaction. Conclusion: These interventions can help address dietary problems and body image perception and support the development of future interventions.

19.
J Agric Food Chem ; 72(14): 8200-8213, 2024 Apr 10.
Article in English | MEDLINE | ID: mdl-38560889

ABSTRACT

Zearalenone (ZEN) is a mycotoxin that is harmful to humans and animals. In this study, female and male rats were exposed to ZEN, and the results showed that ZEN reduced the farnesoid X receptor (FXR) expression levels in the liver and disrupted the enterohepatic circulation of bile acids (BAs). A decrease in food intake induced by ZEN was negatively correlated with an increase in the level of total BAs. BA-targeted metabolomics revealed that ZEN increased glycochenodeoxycholic acid levels and decreased the ratio of conjugated BAs to unconjugated BAs, which further increased the hypothalamic FXR expression levels. Preventing the increase in total BA levels induced by ZEN via Lactobacillus rhamnosus GG intervention restored the appetite. In conclusion, ZEN disrupted the enterohepatic circulation of BAs to decrease the level of food intake. This study reveals a possible mechanism by which ZEN affects food intake and provides a new approach to decrease the toxic effects of ZEN.


Subject(s)
Bile Acids and Salts, Zearalenone, Humans, Rats, Male, Female, Animals, Bile Acids and Salts/metabolism, Zearalenone/metabolism, Liver/metabolism, Hypothalamus, Eating
20.
Article in English | MEDLINE | ID: mdl-38565388

ABSTRACT

While there is extensive research on alcohol dependence, the factors that make an individual vulnerable to developing alcoholism have not been explored in depth. In this study, we investigated how neonatal exposure to a single dose of testosterone propionate (TP) or estradiol valerate (EV) affects ethanol intake and the regulation of the mesolimbic pathway in adult rats. The rats were subjected to a two-bottle free-choice paradigm, and the content of dopamine (DA) and 3,4-dihydroxyphenylacetic acid (DOPAC) in the nucleus accumbens (NAcc) was measured using HPLC-ED. The expression of critical DA-related proteins in the mesolimbic pathway was evaluated through RT-qPCR and western blot analysis. Supraphysiological neonatal exposure to EV or TP resulted in increased ethanol intake over four weeks in adulthood. In addition, DA and DOPAC content was reduced and increased, respectively, in the NAcc of EV- and TP-treated rats, and ß-endorphin content in the hypothalamus decreased in EV-treated rats. Expression of the VTA µ-opioid receptor and the short isoform of the dopamine D2 receptor (D2S) was significantly reduced in EV and TP male rats. Finally, in an extended 6-week protocol, the increase in ethanol consumption induced by EV was mitigated during the initial two hours after naloxone injection. Neonatal exposure to sex hormones is a detrimental stimulus for the brain that can facilitate the development of addictive behaviors, including alcohol use disorder.
